Lab: TensorFlow Convolutional Neural Networks

You are now going to put together everything you have learned and implement a convolutional neural network using TensorFlow. For this lab, you will be working with a brand-new dataset!

Fashion-MNIST Dataset

The Fashion-MNIST dataset consists of a training set of 60,000 examples and a test set of 10,000 examples. Each example is a 28x28 grayscale image, associated with a label from the following 10 classes:

  • T-shirt/top
  • Trouser
  • Pullover
  • Dress
  • Coat
  • Sandal
  • Shirt
  • Sneaker
  • Bag
  • Ankle boot

You will be building a CNN-based model to classify the images from the test set into their corresponding labels! A minimal data-loading sketch is shown below.
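
The sketch below shows one way to load and preprocess Fashion-MNIST using the Keras dataset helper built into TensorFlow. This is only an illustrative assumption; the lab notebook ships with its own data-loading cells, which may load the data differently.

# A minimal sketch of loading Fashion-MNIST with the Keras dataset helper.
# Note: the lab notebook provides its own loading code, which may differ.
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.fashion_mnist.load_data()

# 60,000 training and 10,000 test examples of 28x28 grayscale images.
print(x_train.shape, y_train.shape)   # (60000, 28, 28) (60000,)
print(x_test.shape, y_test.shape)     # (10000, 28, 28) (10000,)

# Scale pixel values to [0, 1] and add a channel dimension for the CNN.
x_train = (x_train / 255.0).reshape(-1, 28, 28, 1)
x_test = (x_test / 255.0).reshape(-1, 28, 28, 1)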

Setup

Make sure you have followed the instructions in the classroom to set up your environment.

Clone the Repository and Run the Notebook

Run the commands below to clone the lab repository and then launch the notebook server:

git clone https://github.com/udacity/RoboND-CNN-Lab.git
# Make sure the environment is activated!
jupyter notebook

The Jupyter interface will open in your browser. You can then access the cloned repo and the notebook from there.

Once the notebook is up and running, you can follow the instructions in it and fill in the required pieces of code marked by TODOs.

This is a self-assessed lab. Feel free to compare your results and discuss improvements with your fellow students on Slack!

Recommendations

Several pieces come into play when creating your deep learning model. The kinds of layers, the number of layers, the weights and biases, the tunable hyperparameters, etc., all play an important part in building a robust model for a specific use case. As a result, there is a lot to explore in this lab.

We suggest you start with a simple implementation: a single convolutional layer with max-pooling, followed by a single fully-connected layer. Observe the loss and validation accuracy you obtain, then gradually refine your model by adding more layers, adding dropout for regularization, tuning your hyperparameters, etc., to achieve a high level of accuracy on your validation and test sets. A minimal starting-point sketch follows.
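
The sketch below illustrates the suggested starting architecture using the Keras API for brevity; the lab notebook's TODOs may expect lower-level TensorFlow ops instead. The filter count (32), hidden-layer size (128), epoch count, and validation split are illustrative assumptions, not values prescribed by the lab.

# A minimal sketch of the suggested starting point: one convolutional layer
# with max-pooling and one fully-connected layer before the 10-class output.
# Layer sizes and training settings here are illustrative assumptions.
import tensorflow as tf

# Load and preprocess the data (see the loading sketch earlier in this lab).
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.fashion_mnist.load_data()
x_train = (x_train / 255.0).reshape(-1, 28, 28, 1)
x_test = (x_test / 255.0).reshape(-1, 28, 28, 1)

model = tf.keras.Sequential([
    # Single convolutional layer with max-pooling.
    tf.keras.layers.Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Flatten(),
    # Single fully-connected layer feeding the 10-class output.
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax'),
])

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Train briefly and watch the loss and validation accuracy before refining
# the model with more layers, dropout, or different hyperparameters.
model.fit(x_train, y_train, epochs=5, validation_split=0.1)
model.evaluate(x_test, y_test)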